# Central limit theorem.
Suppose we repeat an experiment $n$ times independently and take the average of the outcomes. What is the distribution of this average? As it turns out, once suitably rescaled, this distribution approaches a normal distribution as $n$ tends to infinity. This is the central limit theorem.
More precisely, let $X_{1},X_{2},\ldots,X_{n}$ be independent and identically distributed (i.i.d.) random variables with common distribution $F$, where $E(X_{i})=0$ and $E(X_{i}^{2}) = \sigma^{2}$. If we write $$
Z_{n} = \frac{X_{1}+X_{2}+\cdots+X_{n}}{\sigma \sqrt{n}}
$$then $$
P(Z_{n} \le x) \to \Phi(x)
$$
where $\Phi(x)$ is the cumulative distribution function of the standard normal distribution, whose density is $\phi(x) = \frac{1}{\sqrt{2\pi}} e^{-x^{2}/2}$.
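As a quick numerical sanity check, here is a minimal simulation sketch (assuming NumPy and SciPy are available; the uniform distribution and the sizes `n` and `trials` are arbitrary illustrative choices):

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# X_i ~ Uniform(-1, 1): mean 0, variance sigma^2 = (1 - (-1))^2 / 12 = 1/3.
sigma = np.sqrt(1.0 / 3.0)
n, trials = 100, 50_000

# Simulate Z_n = (X_1 + ... + X_n) / (sigma * sqrt(n)), `trials` times.
samples = rng.uniform(-1.0, 1.0, size=(trials, n))
z = samples.sum(axis=1) / (sigma * np.sqrt(n))

# Compare the empirical CDF of Z_n with Phi at a few points.
for x in (-2.0, -1.0, 0.0, 1.0, 2.0):
    print(f"x = {x:+.1f}   P(Z_n <= x) ~ {(z <= x).mean():.4f}   Phi(x) = {norm.cdf(x):.4f}")
```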
To see this, we show that $Z_{n}$ has a moment generating function that converges to the moment generating function of a standard normal distribution.
## Moment generating function.
Define for a random variable $X$ with density $f(x)$, its moment generating function to be $$
M_{X}(t) = E(e^{tX})
$$
In the case where $X$ is discrete, then we have $$
M_{X}(t) = \sum_{x} e^{tx}f(x)
$$And in the case where $X$ is continuous, then we have $$
M_{X}(t) = \int_{-\infty}^{\infty} e^{tx} f(x) dx .
$$
Readers familiar with the Laplace transform may notice a resemblance.
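For a concrete discrete example, let $X$ take the values $\pm 1$ with probability $\frac{1}{2}$ each (a fair coin). Then $$
M_{X}(t) = \frac{1}{2}e^{t} + \frac{1}{2}e^{-t} = \cosh t.
$$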
Assuming convergence, we have $$
M_{X}(t) = E(e^{tX}) = E\left( \sum_{n\ge0} \frac{t^{n}X^{n}}{n!} \right) = \sum_{n\ge 0} \frac{E(X^{n})}{n!}t^{n}
$$So the Taylor coefficients of $M_{X}$ give the moments of $X$, each divided by the corresponding factorial.
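Continuing the coin example, $$
\cosh t = 1 + \frac{t^{2}}{2!} + \frac{t^{4}}{4!} + \cdots
$$so reading off coefficients recovers $E(X) = 0$, $E(X^{2}) = 1$, and more generally all odd moments vanish while $E(X^{2k}) = 1$, which is consistent with the fact that $X^{2} = 1$.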
We observe that if $X\sim\text{normal}(0,1)$, then its moment generating function is $$
M_{X}(t) = \int_{-\infty}^{\infty} e^{tx} \frac{1}{\sqrt{2\pi}} e^{-x^{2} / 2} dx = e^{t^{2}/2}
$$by completing the square.
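Concretely, writing $tx - \frac{x^{2}}{2} = -\frac{(x-t)^{2}}{2} + \frac{t^{2}}{2}$, we get $$
M_{X}(t) = e^{t^{2}/2} \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}} e^{-(x-t)^{2}/2} dx = e^{t^{2}/2},
$$since the remaining integrand is the density of $\text{normal}(t,1)$ and integrates to $1$.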
We note also that if $X$ and $Y$ are independent random variables, then $$
M_{X+Y}(t) = \int_{y=-\infty}^{\infty}\int_{x=-\infty}^{\infty} e^{t(x+y)}f_{X}(x)f_{Y}(y)dxdy=M_{X}(t)M_{Y}(t)
$$by Fubini. Also $$
M_{cX}(t) = \int_{-\infty}^{\infty} e^{tcx} f(x) dx = M_{X}(ct).
$$
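As a sanity check, these two rules are consistent with the normal computation above: for independent $X, Y \sim \text{normal}(0,1)$, $$
M_{X+Y}(t) = e^{t^{2}}, \qquad M_{cX}(t) = e^{c^{2}t^{2}/2},
$$which are the moment generating functions of $\text{normal}(0,2)$ and $\text{normal}(0,c^{2})$ respectively.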
Finally we make a bold claim, which we will not prove: if two distributions have the same moment generating function (finite on a neighborhood of $0$), then they are the same distribution. In fact more holds, and it is what the argument below uses: if the moment generating functions of a sequence of random variables converge pointwise (on a neighborhood of $0$) to the moment generating function of some random variable $Z$, then their distribution functions converge to that of $Z$.
## Central limit theorem.
Let $X_{1},\ldots,X_{n}$ be independent and identically distributed random variables with mean $\mu = 0$ and variance $\sigma^{2}$.
Let $M(t)$ denote the moment generating function of a single $X_{i}$.
Then, by repeated application of the product rule for independent variables above, the sum $S_{n} = X_{1} + X_{2} + \cdots + X_{n}$ has moment generating function $$
M_{S_{n}}(t) = M(t)^{n}
$$
And by the scaling rule, $Z_{n} = \frac{S_{n}}{\sigma \sqrt{n}}$ has moment generating function $$
M_{Z_{n}}(t) = M_{S_{n}}\left( \frac{t}{\sigma \sqrt{n}} \right) = \left[M\left( \frac{t}{\sigma \sqrt{n}} \right)\right]^{n}.
$$
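For the coin example (where $\sigma = 1$), this says $M_{Z_{n}}(t) = \left[\cosh\left( \frac{t}{\sqrt{n}} \right)\right]^{n}$, and one can check directly that this tends to $e^{t^{2}/2}$.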
Now, assuming $M(t)$ is analytic at $t = 0$, and with $E(X)= \mu =0$, $E(X^{2})=\sigma^{2}$, we have $$
M(t) = 1 + \frac{1}{2}\sigma^{2}t^{2} + \epsilon(t)
$$with $$
\frac{\epsilon(t)}{t^{2}} \to 0 \text{ when } t\to 0
$$by Taylor's theorem.
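Here the coefficients come from the moment expansion of $M$ derived earlier: $$
M(0) = 1, \qquad M'(0) = E(X) = \mu = 0, \qquad M''(0) = E(X^{2}) = \sigma^{2}.
$$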
So, $$
M\left( \frac{t}{\sigma \sqrt{n}} \right) = 1 + \frac{1}{2} \sigma^{2} \frac{t^{2}}{\sigma^{2}n} + \epsilon\left( \frac{t}{\sigma \sqrt{n}} \right) = 1 + \frac{1}{2n} t^{2} + \epsilon\left( \frac{t}{\sigma \sqrt{n}} \right)
$$where $$
\frac{\sigma^{2}n}{t^{2}} \epsilon\left( \frac{t}{\sigma \sqrt{n}} \right) \to 0 \text{ as } n\to\infty,
$$since this is exactly $\frac{\epsilon(h)}{h^{2}}$ evaluated at $h = \frac{t}{\sigma \sqrt{n}} \to 0$. In particular, multiplying by the constant $\frac{t^{2}}{\sigma^{2}}$, $$
n \cdot \epsilon\left( \frac{t}{\sigma \sqrt{n}} \right) \to 0 \text{ as } n\to\infty.
$$
Now take the $n$-th power: $$
M_{Z_{n}}(t) = \left[M\left( \frac{t}{\sigma \sqrt{n}} \right)\right]^{n} = \left[1 + \frac{1}{2n} t^{2} + \epsilon\left( \frac{t}{\sigma \sqrt{n}} \right)\right]^{n} = \left[ 1 + \frac{1}{n}\left( \frac{t^{2}}{2} + n \cdot \epsilon\left( \frac{t}{\sigma \sqrt{n}} \right) \right) \right]^{n}.
$$
Now recall if $a_{n} \to a$, then $$
\left( 1+\frac{a_{n}}{n} \right)^{n} \to e^{a}
$$Applying this with $$
a_{n} = \frac{t^{2}}{2} + n \cdot \epsilon \left( \frac{t}{\sigma \sqrt{n}} \right) \to \frac{t^{2}}{2}
$$as $n\to \infty$, we get $$
M_{Z_{n}}(t) \to e^{t^{2}/2}
$$which is the moment generating function of the standard normal! By the claim at the end of the previous section, it follows that $P(Z_{n} \le x) \to \Phi(x)$. $\blacksquare$